
    On the equivalence between standard and sequentially ordered hidden Markov models

    Chopin (2007) introduced a sequentially ordered hidden Markov model, in which states are labelled according to their order of appearance, and claimed that such a model is a re-parametrisation of a standard hidden Markov model. This note gives a formal proof that this equivalence holds in Bayesian terms, as both formulations generate equivalent posterior distributions, but does not hold in Frequentist terms, as both formulations generate incompatible likelihood functions. Perhaps surprisingly, this shows that Bayesian re-parametrisation and Frequentist re-parametrisation are not identical concepts.
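
    To fix ideas, here is a minimal sketch (not from the note itself) of the relabelling that defines the sequentially ordered parametrisation: hidden states are renamed by the rank of their first appearance, so label-switched versions of the same path collapse to a single representative.

```python
def relabel_by_appearance(states):
    """Map each hidden state to the rank of its first appearance.

    Under the sequentially ordered parametrisation, the first state
    visited is labelled 0, the next new state 1, and so on, which
    removes the label-switching ambiguity of a standard HMM.
    """
    mapping = {}
    relabelled = []
    for s in states:
        if s not in mapping:
            mapping[s] = len(mapping)  # next unused ordered label
        relabelled.append(mapping[s])
    return relabelled

# Two label-switched versions of the same path coincide after relabelling.
print(relabel_by_appearance([2, 2, 0, 1, 0]))  # [0, 0, 1, 2, 1]
print(relabel_by_appearance([1, 1, 2, 0, 2]))  # [0, 0, 1, 2, 1]
```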

    Data Reductions and Combinatorial Bounds for Improved Approximation Algorithms

    Kernelization algorithms in the context of Parameterized Complexity are often based on a combination of reduction rules and combinatorial insights. In this paper we present a similar strategy for obtaining polynomial-time approximation algorithms. Our method features the use of approximation-preserving reductions, akin to the notion of parameterized reductions. We exemplify this method to obtain the currently best approximation algorithms for Harmless Set, Differential and Multiple Nonblocker, all of which can be considered in the context of securing networks or information propagation.
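
    As a schematic illustration only, the strategy can be phrased as: apply approximation-preserving reduction rules until none fires, then run a combinatorial approximation on the reduced instance. The rule and the greedy step below are hypothetical stand-ins, not the paper's actual rules for Harmless Set, Differential, or Multiple Nonblocker.

```python
def reduce_instance(adj):
    """Apply a toy reduction rule to a fixpoint: delete isolated vertices,
    which can never contribute to a covering-type objective. This is a
    hypothetical stand-in for the paper's problem-specific rules."""
    adj = {v: set(ns) for v, ns in adj.items()}
    while True:
        isolated = [v for v, ns in adj.items() if not ns]
        if not isolated:
            return adj
        for v in isolated:
            del adj[v]  # the rule must preserve the approximation ratio

def approximate_on_kernel(adj):
    """Placeholder greedy step (it computes a maximal independent set)
    standing in for the combinatorial approximation run on the kernel."""
    chosen = set()
    for v, ns in adj.items():
        if not ns & chosen:  # no neighbour has been picked yet
            chosen.add(v)
    return chosen

# Tiny undirected graph as an adjacency dict; vertex 3 is isolated.
adj = {1: {2}, 2: {1}, 3: set(), 4: {5}, 5: {4}}
print(approximate_on_kernel(reduce_instance(adj)))  # e.g. {1, 4}
```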

    An Empirical Examination of Compensation of REIT Managers

    Principal-agent literature finds that manager and owner incentives can be aligned with performance-contingent contracts. We investigate the compensation of Real Estate Investment Trust (REIT) industry executives. The competitive nature of mortgage and equity markets, in conjunction with the corporate tax exemption available when REITs distribute most of their earnings as dividends, is likely to influence the compensation of REIT managers. Executive compensation is modeled as a function of revenues and unexpected profit. After transforming the model to reduce collinearity and heteroskedasticity, we find compensation to be generally positively related to revenue. We also find unexpected profit to be generally insignificantly related to compensation, but positively related in those cases where it is significant.
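
    A minimal sketch of this kind of specification, with simulated data and hypothetical variable names; the paper's actual transformation and dataset differ.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
revenue = rng.lognormal(mean=10, sigma=1, size=n)  # hypothetical firm revenues
unexp_profit = rng.normal(0, 1, size=n)            # hypothetical profit surprise
comp = 50 + 2.0 * np.log(revenue) + 0.1 * unexp_profit + rng.normal(0, 1, n)

# Logging revenue is one common transformation to tame collinearity and
# heteroskedasticity; the paper's actual transformation may differ.
X = sm.add_constant(np.column_stack([np.log(revenue), unexp_profit]))
fit = sm.OLS(comp, X).fit(cov_type="HC1")          # heteroskedasticity-robust SEs
print(fit.summary())
```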

    Mortgage Lenders' Market Response to a Landmark Regulatory Decision Based on Fair Lending Compliance

    Regulation of real estate lending has substantially increased in the past decade. Government efforts to improve compliance with Community Reinvestment Act mandates are evidence of increased emphasis on racial equal opportunity in loan origination. To investigate the impact of these efforts, this paper examines the Federal Reserve Bank rejection of Shawmut National Corporation's application to buy New Dartmouth Bank. Rejection was based on Shawmut's poor compliance with fair-lending guidelines. Testing finds significant negative abnormal stock returns for samples of mortgage lenders on the announcement day of Shawmut's application rejection. In addition, cross-sectional analysis reveals an inverse relationship between national banks' cumulative abnormal returns (CARs) and a measure of fair lending.
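
    For readers unfamiliar with the event-study machinery, here is a stripped-down sketch of computing abnormal returns and a CAR under the standard market model; the returns below are simulated placeholders, not the paper's sample.

```python
import numpy as np

rng = np.random.default_rng(1)
T_est, T_evt = 250, 5                      # estimation and event windows (days)
mkt = rng.normal(0.0004, 0.01, T_est + T_evt)
stock = 0.0002 + 1.1 * mkt + rng.normal(0, 0.01, T_est + T_evt)

# Market model fitted over the estimation window.
a, b = np.polyfit(mkt[:T_est], stock[:T_est], 1)[::-1]  # intercept, slope
abnormal = stock[T_est:] - (a + b * mkt[T_est:])        # event-window ARs
car = abnormal.sum()                                    # cumulative abnormal return
print(f"CAR over {T_evt}-day window: {car:.4f}")
```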

    Bayesian optimization using sequential Monte Carlo

    We consider the problem of optimizing a real-valued continuous function f using a Bayesian approach, where the evaluations of f are chosen sequentially by combining prior information about f, described by a random process model, and past evaluation results. The main difficulty with this approach is being able to compute the posterior distributions of the quantities of interest that are used to choose evaluation points. In this article, we use a Sequential Monte Carlo (SMC) approach.
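
    The paper's contribution concerns computing the needed posterior quantities by SMC; as scaffolding, here is a minimal sketch of the surrounding Bayesian-optimization step, using an exact Gaussian-process posterior and the expected-improvement rule. The kernel, its parameters, and the toy objective are assumptions of this sketch, not necessarily the authors' choices.

```python
import numpy as np
from scipy.stats import norm

def gp_posterior(Xs, X, y, ell=0.3, noise=1e-6):
    """Posterior mean/sd of a zero-mean, unit-variance GP with SE kernel."""
    k = lambda a, b: np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)
    K = k(X, X) + noise * np.eye(len(X))
    Ks = k(Xs, X)
    mu = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.einsum("ij,ij->i", Ks, np.linalg.solve(K, Ks.T).T)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sd, best):
    """Closed-form EI for minimisation."""
    z = (best - mu) / sd
    return (best - mu) * norm.cdf(z) + sd * norm.pdf(z)

f = lambda x: np.sin(3 * x) + x ** 2          # toy objective on [0, 1]
X = np.array([0.1, 0.5, 0.9]); y = f(X)       # past evaluations
grid = np.linspace(0, 1, 201)
mu, sd = gp_posterior(grid, X, y)
x_next = grid[np.argmax(expected_improvement(mu, sd, y.min()))]
print("next evaluation point:", x_next)
```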

    Reclaiming human machine nature

    Extending and modifying its domain of life through artifact production is one of the main characteristics of humankind. From the first hominid, who used a wooden stick or a stone to extend his upper limbs and augment the strength of his gestures, to current systems engineers who use technologies to augment human cognition, perception and action, extending the capabilities of the human body remains a major issue. For more than fifty years, cybernetics, computer science and cognitive science have imposed a single reductionist model of human-machine systems: cognitive systems. Inspired by philosophy, behaviorist psychology and the information-processing metaphor, the cognitive-systems paradigm requires a functional view and a functional analysis in the human-systems design process. Under that design approach, humans have been reduced to their metaphysical and functional properties in a new dualism. Human body requirements have been left to physical ergonomics or "physiology". With multidisciplinary convergence, the issues of "human-machine" systems and "human artifacts" evolve. The loss of biological and social boundaries between human organisms and interactive, informational physical artifacts calls into question current engineering methods and the ergonomic design of cognitive systems. New developments of human-machine systems for intensive care, human space activities or bio-engineering systems require grounding human-systems design in a renewed epistemological framework for future human-systems models and evidence-based "bio-engineering". In that context, reclaiming human factors, the augmented human and human-machine nature is a necessity. Comment: Published in HCI International 2014, Heraklion, Greece (2014).

    Properties of Nested Sampling

    Nested sampling is a simulation method for approximating marginal likelihoods proposed by Skilling (2006). We establish that nested sampling has an approximation error that vanishes at the standard Monte Carlo rate and that this error is asymptotically Gaussian. We show that the asymptotic variance of the nested sampling approximation typically grows linearly with the dimension of the parameter. We discuss the applicability and efficiency of nested sampling in realistic problems, and we compare it with two current methods for computing marginal likelihood. We propose an extension that avoids resorting to Markov chain Monte Carlo to obtain the simulated points. Comment: Revision submitted to Biometrika.
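
    A bare-bones sketch of Skilling's nested sampling loop on a toy problem. The brute-force rejection step used to sample above the likelihood threshold is exactly the step that practical implementations replace with MCMC, which the extension mentioned above seeks to avoid; the toy model and constants are this sketch's assumptions.

```python
import numpy as np

def nested_sampling(loglike, prior_sample, n_live=100, n_iter=600, rng=None):
    """Basic nested sampling estimate of the log marginal likelihood.

    New live points are drawn by brute-force rejection from the prior
    subject to the hard likelihood constraint. The final live-point
    contribution to the evidence is omitted for brevity.
    """
    rng = rng or np.random.default_rng(0)
    live = prior_sample(n_live, rng)
    live_ll = np.array([loglike(x) for x in live])
    logZ = -np.inf
    log_shrink = np.log1p(-np.exp(-1.0 / n_live))  # log(X_0 - X_1)
    for i in range(n_iter):
        worst = np.argmin(live_ll)
        # Prior mass contracts by a factor exp(-1/n_live) per iteration.
        logZ = np.logaddexp(logZ, log_shrink - i / n_live + live_ll[worst])
        while True:  # rejection sampling above the current threshold
            x = prior_sample(1, rng)[0]
            if loglike(x) > live_ll[worst]:
                break
        live[worst], live_ll[worst] = x, loglike(x)
    return logZ

# Toy check: N(0, 0.1^2) likelihood under a U(-1, 1) prior, so the
# evidence is about 1/2 and log Z is about -0.69.
ll = lambda x: -0.5 * (x / 0.1) ** 2 - 0.5 * np.log(2 * np.pi * 0.1 ** 2)
ps = lambda n, rng: rng.uniform(-1.0, 1.0, n)
print(nested_sampling(ll, ps))
```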

    Sequential quasi-Monte Carlo: Introduction for Non-Experts, Dimension Reduction, Application to Partly Observed Diffusion Processes

    SMC (Sequential Monte Carlo) is a class of Monte Carlo algorithms for filtering and related sequential problems. Gerber and Chopin (2015) introduced SQMC (Sequential quasi-Monte Carlo), a QMC version of SMC. This paper has two objectives: (a) to introduce Sequential Monte Carlo to the QMC community, whose members are usually less familiar with state-space models and particle filtering; (b) to extend SQMC to the filtering of continuous-time state-space models, where the latent process is a diffusion. A recurring point in the paper is the notion of dimension reduction, that is, how to implement SQMC in such a way that it provides good performance despite the high dimension of the problem. Comment: To be published in the proceedings of MCQMC 201
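
    For readers new to SMC, here is a minimal bootstrap particle filter for a toy linear-Gaussian state-space model. SQMC keeps this structure but drives the propagation and resampling steps with quasi-Monte Carlo point sets instead of the pseudo-random draws below; the model and parameters are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def bootstrap_filter(y, n_part=500, rho=0.9, sig_x=1.0, sig_y=1.0, rng=None):
    """Bootstrap particle filter for the toy state-space model
    x_t = rho * x_{t-1} + sig_x * eps_t,   y_t = x_t + sig_y * eta_t,
    returning the log-likelihood estimate of the observation sequence.
    """
    rng = rng or np.random.default_rng(0)
    x = rng.normal(0, sig_x / np.sqrt(1 - rho**2), n_part)  # stationary init
    loglik = 0.0
    for yt in y:
        x = rho * x + sig_x * rng.normal(size=n_part)       # propagate
        logw = -0.5 * ((yt - x) / sig_y) ** 2               # unnormalised obs density
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean()) - 0.5 * np.log(2 * np.pi * sig_y**2)
        x = rng.choice(x, size=n_part, p=w / w.sum())       # multinomial resampling
    return loglik

y = np.array([0.3, -0.1, 0.8, 0.2])  # toy observations
print(bootstrap_filter(y))
```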

    Harold Jeffreys's Theory of Probability Revisited

    Published exactly seventy years ago, Jeffreys's Theory of Probability (1939) has had a unique impact on the Bayesian community and is now considered to be one of the main classics in Bayesian statistics, as well as the initiator of the objective Bayes school. In particular, its advances on the derivation of noninformative priors and on the scaling of Bayes factors have had a lasting impact on the field. However, the book reflects the characteristics of its time, especially in terms of mathematical rigor. In this paper we point out the fundamental aspects of this reference work, especially its thorough coverage of testing problems and its construction of both estimation and testing noninformative priors based on functional divergences. Our major aim here is to help modern readers navigate this difficult text and concentrate on passages that are still relevant today. Comment: This paper is commented on in [arXiv:1001.2967], [arXiv:1001.2968], [arXiv:1001.2970], [arXiv:1001.2975], [arXiv:1001.2985] and [arXiv:1001.3073], with a rejoinder in [arXiv:0909.1008]. Published at http://dx.doi.org/10.1214/09-STS284 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
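
    For orientation, the book's best-known construction is the noninformative prior proportional to the square root of the Fisher information; a standard worked instance (not taken from this paper):

```latex
% Jeffreys's general rule: \pi(\theta) \propto \sqrt{\det I(\theta)},
% with I(\theta) the Fisher information. Bernoulli example:
\[
  I(\theta)
  = \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}
      \log f(X \mid \theta)\right)^{2}\right]
  = \frac{1}{\theta(1-\theta)},
  \qquad
  \pi(\theta) \propto \theta^{-1/2}(1-\theta)^{-1/2},
\]
% i.e. the Beta(1/2, 1/2) prior, invariant under smooth reparametrisation.
```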

    Kernel Sequential Monte Carlo

    We propose kernel sequential Monte Carlo (KSMC), a framework for sampling from static target densities. KSMC is a family of sequential Monte Carlo algorithms that are based on building emulator models of the current particle system in a reproducing kernel Hilbert space. Here we focus on modelling nonlinear covariance structure and gradients of the target. The emulator's geometry is adaptively updated and subsequently used to inform local proposals. Unlike in adaptive Markov chain Monte Carlo, continuous adaptation does not compromise convergence of the sampler. KSMC combines the strengths of sequential Monte Carlo and kernel methods: superior performance for multimodal targets and the ability to estimate model evidence, as compared to Markov chain Monte Carlo, and the emulator's ability to represent targets that exhibit high degrees of nonlinearity. As KSMC does not require access to target gradients, it is particularly applicable to targets whose gradients are unknown or prohibitively expensive. We describe the necessary tuning details and demonstrate the benefits of the proposed methodology on a series of challenging synthetic and real-world examples.
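
    A simplified sketch of the surrounding SMC-sampler structure: a tempered sequence of targets with random-walk proposals whose covariance is re-estimated from the current particles at every step. KSMC replaces this crude covariance adaptation with kernel emulators in an RKHS, so everything below is a stand-in rather than the authors' algorithm; it does illustrate why continuous adaptation is safe in SMC, since each move step targets a fixed tempered density.

```python
import numpy as np

def adaptive_smc(logtarget, d=2, n_part=300, n_temp=20, rng=None):
    """SMC sampler tempering from N(0, I) to the target, with proposals
    adapted from the current particle system at every step."""
    rng = rng or np.random.default_rng(0)
    x = rng.normal(size=(n_part, d))
    prior_lp = lambda z: -0.5 * (z ** 2).sum(axis=1)
    betas = np.linspace(0, 1, n_temp)
    for b0, b1 in zip(betas[:-1], betas[1:]):
        incr = (b1 - b0) * (logtarget(x) - prior_lp(x))      # incremental weights
        w = np.exp(incr - incr.max()); w /= w.sum()
        x = x[rng.choice(n_part, n_part, p=w)]               # resample
        cov = np.cov(x.T) + 1e-6 * np.eye(d)                 # adapt from particles
        prop = x + rng.multivariate_normal(np.zeros(d), 2.38**2 / d * cov, n_part)
        lp_cur = b1 * logtarget(x) + (1 - b1) * prior_lp(x)
        lp_new = b1 * logtarget(prop) + (1 - b1) * prior_lp(prop)
        accept = np.log(rng.uniform(size=n_part)) < lp_new - lp_cur
        x[accept] = prop[accept]                             # MH move step
    return x

# Bimodal toy target: mixture of two Gaussians centred at +/-2.
lt = lambda z: np.logaddexp(-0.5 * ((z - 2) ** 2).sum(axis=1),
                            -0.5 * ((z + 2) ** 2).sum(axis=1))
samples = adaptive_smc(lt)
print(samples.mean(axis=0))  # near 0 if both modes are represented
```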